HCBS Survey Analysis

Participant & Provider Congruence

Presented to Implementation Advisory Group

2018.09.27

Overview

  • Context and Approach
  • Quality Measurement
  • Data Cleaning
  • Outlier Identification
  • Congruence / Mismatch Analysis
  • Next Steps
  • Opportunities for Feedback

Context and Approach

  • Longer-term strategy for use of HCBS data goes beyond compliance focus to quality
  • Focus on participants’ personal experience of settings and services
  • The HCBS Final Rule provisions covering Section 1915(c), (i), and (k) programs indicate that metrics need to be developed for quality improvement
  • Quality measures for HCBS are required for waiver renewal

Data Cleaning

  • Data sources: Qualtrics extract, BCAL AFCs, WSA, Survey Domains and Crosswalks
  • Cleaning and mapping of provider names to consistent organizations
  • The entire analysis, including the data cleaning process, uses principles of reproducible research for durability and transparency.

Quality Measurement

  • In the broader realm of quality measurement, participant surveys fit within the experience of care domain.
  • According to AHRQ, experience of care measures tell us “whether something that should happen…actually happened or how often it happened.”
  • Such measures identify instances where an individual’s experience did not correspond with defined best practices.

Mismatch Analysis

  • Since we have provider survey data in addition to participant survey data, we can also evaluate Congruence: Is the participant’s experience of a setting or service consistent with the perception of the provider?
  • Example: Regarding the question ‘Do you pick the direct support workers who provide your services and supports?’, we can evaluate both (a) whether the participant was involved in choice, and (b) whether their perception differs from their provider’s.
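The congruence check above can be sketched in Python; the records, field names, and answers here are hypothetical, not drawn from the survey data:

```python
# Minimal sketch of the congruence check: each record pairs a participant's
# answer with the matched provider's answer to the same question.
records = [
    {"provider": "Org A", "participant_answer": "Yes", "provider_answer": "Yes"},
    {"provider": "Org A", "participant_answer": "No",  "provider_answer": "Yes"},
    {"provider": "Org B", "participant_answer": "Yes", "provider_answer": "Yes"},
]

def is_mismatch(record):
    """A mismatch: the participant's experience differs from the provider's perception."""
    return record["participant_answer"] != record["provider_answer"]

mismatches = [r for r in records if is_mismatch(r)]
print(f"{len(mismatches)} of {len(records)} matched surveys disagree")
```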

Match ≠ Comply

The table below breaks down responses based on whether the answer(s) complied with HCBS Final Rule requirements:
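Match and compliance are independent dimensions: a participant and provider can agree on an answer that still fails a requirement. A minimal sketch, assuming (hypothetically) that “Yes” is the compliant answer for this question:

```python
# Hypothetical illustration that "match" and "comply" vary independently.
# Assume "Yes" is the answer that complies with the HCBS Final Rule
# (the participant picks their direct support workers).
COMPLIANT_ANSWER = "Yes"

def classify(participant, provider):
    """Return (match, comply) for one matched survey pair."""
    match = participant == provider
    comply = participant == COMPLIANT_ANSWER
    return match, comply

# Agreement on "No" is a match, but not compliant.
print(classify("No", "No"))   # (True, False)
print(classify("Yes", "No"))  # (False, True)
```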

Outlier Identification

Can we establish a baseline rate for mismatches across providers and/or PIHPs, such that a specific provider could be flagged as an outlier from this baseline rate?

Outlier Identification Methodology

The methodology developed with the BHDDA HCBS Team includes the following steps:

  • Focus on well-understood questions
  • Group by size of provider
  • Apply an interquartile range (IQR) calculation

Well-Understood Questions

To decrease the likelihood that mismatches are due to poorly understood question phrasing, we remove items where >5% of responses were “I don’t know.”
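This filter can be sketched as follows; the question labels and response counts are hypothetical:

```python
# Sketch of the "well-understood questions" filter: drop any item where
# more than 5% of responses were "I don't know".
responses_by_question = {
    "Q1": ["Yes"] * 95 + ["I don't know"] * 5,   # exactly 5% -> kept
    "Q2": ["No"] * 90 + ["I don't know"] * 10,   # 10% -> removed
}

def well_understood(responses, threshold=0.05):
    dont_know = sum(1 for r in responses if r == "I don't know")
    return dont_know / len(responses) <= threshold

kept = [q for q, resp in responses_by_question.items() if well_understood(resp)]
print(kept)  # ['Q1']
```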

Size of Provider

To account for differences in mismatch rates driven by the size of the denominator, we applied the outlier calculation separately by provider size, based on the number of matched surveys: Individual (1 survey), Small (2–13 surveys), Large (14 or more surveys).
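A minimal sketch of the size grouping, using the cutoffs above:

```python
# Assign a provider to a size grouping by its number of matched surveys.
def size_group(n_matched_surveys):
    if n_matched_surveys == 1:
        return "Individual"
    elif n_matched_surveys <= 13:
        return "Small"
    return "Large"  # 14 or more matched surveys

print(size_group(1), size_group(7), size_group(20))  # Individual Small Large
```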

Outlier Thresholds

The thresholds for identification of outliers are summarized below for each grouping size of provider:
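The IQR step can be sketched with Python’s standard library; the mismatch rates below are hypothetical, and the conventional Q3 + 1.5 × IQR upper fence is assumed here rather than taken from the methodology itself:

```python
import statistics

# Hypothetical mismatch rates for providers within one size grouping.
rates = [0.05, 0.08, 0.10, 0.10, 0.12, 0.15, 0.40]

# Quartiles via the inclusive method; IQR = Q3 - Q1.
q1, _, q3 = statistics.quantiles(rates, n=4, method="inclusive")
upper_fence = q3 + 1.5 * (q3 - q1)

# Providers whose rate exceeds the fence are flagged as outliers.
outliers = [r for r in rates if r > upper_fence]
print(f"upper fence = {upper_fence:.4f}; outliers = {outliers}")
```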

Identifying Outliers

The chart below shows the number of organizations in each size grouping, broken down by outlier status:

Next Steps

  • Outlier identification reports, including technical assistance for use
  • Development of discrete quality metrics to support implementation of the HCBS Final Rule
  • Additional analysis integrated with SIS and related datasets
  • The BHDDA HCBS team is listening for ideas on how best to use this information

Other Potential Uses

A short list of potential analyses includes:

  • What level of support is needed at a given program, related to specific HCBS domains?
  • Can we identify provider settings where there is a high degree of mismatch and a high level of need in related areas?
  • Impact of heightened scrutiny on provider network
  • Relationship of needs to use of services in HCBS Settings
  • Open to feedback on these or new ideas!

Questions? Suggestions?

Developed by:

In collaboration with: